# Ro Bart 512

This is a BART base model pre-trained from scratch, specifically designed for Romanian text processing tasks. It supports a maximum sequence length of 512 tokens.

Tags: Large Language Model, Transformers, Other

Author: Iulian277
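Since the card tags this as a Transformers model, a minimal usage sketch with the Hugging Face `transformers` library may help. The Hub ID `Iulian277/ro-bart-512` is an assumption pieced together from the author and model name above, so verify it on the author's page before use.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub ID, combining the author and model name from this card.
model_id = "Iulian277/ro-bart-512"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Encode Romanian text, truncating to the model's 512-token limit.
inputs = tokenizer(
    "Acesta este un exemplu de text în limba română.",
    return_tensors="pt",
    truncation=True,
    max_length=512,
)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the checkpoint is pre-trained from scratch rather than fine-tuned, it is best treated as a starting point for downstream Romanian tasks (summarization, text correction, and so on) rather than used for generation as-is.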
# T5 Base Dutch

This is a Dutch pre-trained model based on the T5 architecture, with 222 million parameters, trained on the cleaned Dutch mC4 dataset.

License: Apache-2.0
Tags: Large Language Model, Other
Author: yhavinga
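A similar sketch for the Dutch model, again assuming a Hub ID (`yhavinga/t5-base-dutch`) derived from the author and model name above. T5 checkpoints pre-trained with span corruption on mC4 predict masked spans marked by sentinel tokens, and normally need fine-tuning before they are useful on real tasks.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumed Hub ID, combining the author and model name from this card.
model_id = "yhavinga/t5-base-dutch"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# The pre-training objective fills in masked spans; <extra_id_0> is
# T5's first sentinel token, marking the span the model should predict.
inputs = tokenizer("Dit is een <extra_id_0> zin.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```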